Deep Learning from Scratch: From Basics to Building Real Neural Networks in Python with Keras by Artem Kovera
Author: Artem Kovera
Language: eng
Format: azw3
Published: 2018-11-08T16:00:00+00:00
A pooling layer has no learnable parameters. Because of this, pooling layers are usually not included when counting the total number of layers in a convolutional network. For example, if a neural network has seven convolutional layers, three fully-connected layers, one softmax layer, and four pooling layers, we say it is an 11-layer-deep neural network (7 + 3 + 1), because we only count layers with learnable parameters.
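This counting convention can be sketched in a few lines of Python; the layer labels below are illustrative, not a real network definition:

```python
# Depth is counted over layers with learnable parameters only.
# The labels are illustrative placeholders for the example network above.
layers = ["conv"] * 7 + ["fc"] * 3 + ["softmax"] + ["pool"] * 4

# Pooling layers have no learnable parameters, so they are excluded.
learnable = [layer for layer in layers if layer != "pool"]
depth = len(learnable)
print(depth)  # 11
```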
In some recent convolutional neural network architectures, the fully-connected layers at the end of the network are replaced by a global average pooling layer. This drastically reduces the total number of learnable parameters in the network, which helps prevent overfitting.
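A rough parameter-count comparison illustrates the saving; the feature-map shape (7×7×512) and class count (1000) below are assumed values for the sake of the arithmetic, and biases are ignored:

```python
# Hypothetical final feature map of a convolutional network: 7 x 7 x 512.
h, w, c = 7, 7, 512
num_classes = 1000

# Option 1: flatten the feature map, then a fully-connected classifier.
fc_params = (h * w * c) * num_classes   # 25,088,000 weights

# Option 2: global average pooling (no parameters) collapses each channel
# to a single value, so the classifier only sees a 512-dim vector.
gap_params = c * num_classes            # 512,000 weights

print(fc_params // gap_params)  # 49x fewer weights in the classifier
```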
In addition to convolutional neural networks, there are also so-called deconvolutional neural networks. These networks utilize special operations called deconvolutions or transposed convolutions.
Deconvolutional neural networks usually use a method called unpooling, or upsampling, in their unpooling layers. Basically, it is the opposite of what pooling does: the output of an unpooling layer is an enlarged but sparse feature map. After unpooling, a deconvolutional network applies a deconvolutional layer, which densifies the sparse feature map coming from the unpooling layer using transposed convolutions. As a result, deconvolutional neural networks are able to increase the spatial resolution of images or, more generally, the dimensionality of their inputs. Deconvolutional neural networks can be used in generative models, for example.
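One common way max-unpooling is implemented is to remember the argmax positions during pooling and scatter the pooled values back to those positions, producing the enlarged-but-sparse map described above. The following NumPy sketch uses a toy 4×4 input with 2×2 pooling; the values are illustrative:

```python
import numpy as np

# Toy 2x2 max pooling (stride 2) that records argmax positions,
# followed by unpooling that scatters the values back.
x = np.array([[1., 3., 2., 1.],
              [4., 2., 0., 5.],
              [6., 0., 1., 2.],
              [3., 1., 4., 0.]])

pooled = np.zeros((2, 2))
indices = np.zeros((2, 2, 2), dtype=int)  # remembered (row, col) of each max
for i in range(2):
    for j in range(2):
        window = x[2*i:2*i+2, 2*j:2*j+2]
        r, c = np.unravel_index(np.argmax(window), window.shape)
        pooled[i, j] = window[r, c]
        indices[i, j] = (2*i + r, 2*j + c)

# Unpooling: place each pooled value at its remembered position.
# The result is the same size as x but mostly zeros (sparse).
unpooled = np.zeros_like(x)
for i in range(2):
    for j in range(2):
        r, c = indices[i, j]
        unpooled[r, c] = pooled[i, j]
```

In a real deconvolutional network, a transposed-convolution layer would then densify `unpooled`.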
The same input-upsampling effect provided by deconvolutional networks can also be achieved with convolutional networks that use appropriate upsampling and convolutional layers, so we don't necessarily need deconvolutional neural networks. Moreover, networks that contain deconvolutional layers are often simply called convolutional.
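One such alternative is nearest-neighbor upsampling (the default behavior of Keras's UpSampling2D layer) followed by an ordinary convolution to densify the result. The upsampling step alone can be sketched in NumPy; the 2×2 input and factor of 2 are illustrative:

```python
import numpy as np

# Nearest-neighbor upsampling: repeat each row and column twice,
# enlarging a 2x2 map to 4x4. A regular convolution applied afterwards
# would smooth/densify this, mimicking a transposed convolution.
x = np.array([[1., 2.],
              [3., 4.]])

up = np.repeat(np.repeat(x, 2, axis=0), 2, axis=1)
print(up.shape)  # (4, 4)
```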